
fix(core): log censoring is slow if response has many URLs #524

Closed · wants to merge 2 commits

Conversation


@eliangcs eliangcs commented Apr 1, 2022

Fixes an issue where log censoring becomes too slow when an app returns a response containing many (e.g., thousands of) URLs. The new test case should be self-explanatory.

@eliangcs eliangcs requested a review from xavdid as a code owner April 1, 2022 06:00

@xavdid xavdid left a comment


Ok, so the salient change here is that instead of using findSensitiveValues(data) (which finds sensitive keys and URLs with querystrings), we instead only find sensitive keys (under the assumption that the logged body won't contain any new secrets not already present in authData).
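The difference between the two strategies can be sketched as follows. These are simplified stand-ins for the helpers named above, not the actual zapier-platform-core or secret-scrubber implementations; the key list and URL handling are illustrative assumptions:

```javascript
// Hypothetical stand-in for the sensitive-key check.
const SENSITIVE_KEYS = ['api_key', 'token', 'secret', 'password'];
const isSensitiveKey = (key) =>
  SENSITIVE_KEYS.some((s) => key.toLowerCase().includes(s));

// Old behavior (sketch): collect values under sensitive keys AND every
// querystring value from every URL found anywhere in the data.
const findSensitiveValues = (data, found = []) => {
  for (const [key, value] of Object.entries(data)) {
    if (typeof value === 'object' && value !== null) {
      findSensitiveValues(value, found);
    } else if (typeof value === 'string') {
      if (isSensitiveKey(key)) {
        found.push(value);
      }
      // Every URL with a querystring contributes its param values, so a
      // response with thousands of URLs yields thousands of "sensitive"
      // values for the scrubber to censor.
      try {
        const url = new URL(value);
        found.push(...url.searchParams.values());
      } catch {
        // not a URL; ignore
      }
    }
  }
  return found;
};

// New behavior in this PR (sketch): only look at sensitive keys.
const findSensitiveKeyValues = (data, found = []) => {
  for (const [key, value] of Object.entries(data)) {
    if (typeof value === 'object' && value !== null) {
      findSensitiveKeyValues(value, found);
    } else if (typeof value === 'string' && isSensitiveKey(key)) {
      found.push(value);
    }
  }
  return found;
};
```

With a response containing N URLs, the old approach produces O(N) values to censor, while the new one only surfaces values stored under sensitive-looking keys.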

I'm a little wary of censoring fewer things, but that does seem like a corner case. All else equal, I'd have focused on speed improvements in secret scrubber (if possible), but I think this is a reasonable first step.

Thanks for digging in here!

json: {
  data: [
    // An app could have thousands of URLs in a response. We don't
    // want these to be in the sensitive vallues and make secret

Typo: "vallues" → "values"


eliangcs commented Apr 4, 2022

@xavdid I was wondering why we were censoring all the URLs (with query params) from the log extra in the first place. Did we do that to fix another issue where logs weren't censored enough? If yes, merging this PR would create another regression (where logs are not censored enough).

> I'd have focused on speed improvements in secret scrubber (if possible)

I also think fixing this in secret-scrubber is better. Can you do it? We don't have to rush to merge this PR, since we can downgrade an app to 9.7.3 to fix the timeout issue.


xavdid commented Apr 4, 2022

> Did we do that to fix another issue where logs weren't censored enough?

Yes, I think in a couple of different places. Especially during auth refresh, URLs will have query params that need to be censored (but with new values that aren't in authData, since they're novel tokens). I think #85 is one such example, but I bet there are more in the monorepo.

I think it's fair to exclude the request body from checking for sensitive URLs - that does seem like a corner case. But we should probably keep checking them for auth requests and in the log messages themselves.

I think in the interest of not causing regressions, downgrading affected apps for now is the way to go, and I can take a pass at profiling and improving the URL censoring in the scrubber. If I can't get to the bottom of it in a day or two, we can move forward here and adjust if we notice any regressions.
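For context on why the URL-derived values make censoring slow: a naive scrubber does one replace pass over the serialized log per sensitive value, so cost grows roughly as O(numValues × logSize). This is an illustrative sketch of that cost model, not the actual secret-scrubber implementation:

```javascript
// Sketch of a naive scrubber: one full pass over the log text for each
// sensitive value. With thousands of querystring values extracted from
// URLs, this loop runs thousands of times over the whole log body.
const scrub = (logText, sensitiveValues) => {
  let out = logText;
  for (const value of sensitiveValues) {
    // split/join replaces every occurrence without regex-escaping issues
    out = out.split(value).join(':censored:');
  }
  return out;
};
```

Cutting the list of sensitive values down to only sensitive-key matches (as this PR does) shrinks the outer loop; speeding up the per-value pass would be the complementary fix inside secret-scrubber.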

// extra should be safe.
const sensitiveLogData = recurseExtract(
  data,
  (key, value) => _.isString(value) && isSensitiveKey(key)
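The predicate in the diff above only matches string values. A minimal stand-alone sketch (with a hypothetical local recurseExtract and a plain typeof check in place of _.isString) shows why a numeric secret would slip through:

```javascript
// Hypothetical stand-in for the recurseExtract helper in the diff above,
// not the actual zapier-platform-core code.
const recurseExtract = (data, matcher, found = []) => {
  for (const [key, value] of Object.entries(data)) {
    if (typeof value === 'object' && value !== null) {
      recurseExtract(value, matcher, found);
    } else if (matcher(key, value)) {
      found.push(value);
    }
  }
  return found;
};

// Assumed sensitive-key check for illustration.
const isSensitiveKey = (key) => /secret|token|password|api_key/i.test(key);

// The predicate from the diff: string values only.
const stringOnly = (key, value) =>
  typeof value === 'string' && isSensitiveKey(key);

const data = { secretNumber: 123456789, token: 'abc' };
const caught = recurseExtract(data, stringOnly);
// → ['abc']; the numeric secret 123456789 is missed
```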

A sensitive key with a numeric value should also be caught, e.g. {secretNumber: 123456789}

@xavdid

xavdid commented Apr 6, 2022

fixed in secret scrubber, closing

@xavdid xavdid closed this Apr 6, 2022
@xavdid xavdid deleted the PDE-3138-log-censoring-many-urls branch April 6, 2022 19:05